📚 node [[k median|k median]]
⥅ related node [[k median]]
⥅ node [[k-median]] pulled by Agora

k-median

Go back to the [[AI Glossary]]

#clustering

A clustering algorithm closely related to k-means. The practical difference between the two is as follows:

  • In k-means, centroids are determined by minimizing the sum of the squared distances between a centroid candidate and each of its examples.
  • In k-median, centroids are determined by minimizing the sum of the distances between a centroid candidate and each of its examples.
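The two objectives can be sketched as cost functions over a candidate centroid. This is a minimal illustration, not any particular library's implementation; the function names `kmeans_cost` and `kmedian_cost` are made up for this example, and k-median here uses Manhattan distance as described in the note that follows.

```python
def kmeans_cost(centroid, examples):
    # k-means objective: sum of squared Euclidean distances
    # from the centroid candidate to each example.
    return sum(
        sum((c - x) ** 2 for c, x in zip(centroid, ex))
        for ex in examples
    )

def kmedian_cost(centroid, examples):
    # k-median objective: sum of (unsquared) Manhattan distances
    # from the centroid candidate to each example.
    return sum(
        sum(abs(c - x) for c, x in zip(centroid, ex))
        for ex in examples
    )

print(kmeans_cost((0, 0), [(1, 0), (0, 2)]))   # 1 + 4 = 5
print(kmedian_cost((0, 0), [(1, 0), (0, 2)]))  # 1 + 2 = 3
```

Because k-means squares the distances, outlying examples pull centroids more strongly than they do under the k-median objective.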

Note that the definitions of distance are also different:

  • k-means relies on the Euclidean distance from the centroid to an example. (In two dimensions, the Euclidean distance means using the Pythagorean theorem to calculate the hypotenuse.) For example, the k-means distance between (2,2) and (5,-2) would be:

$$ \text{Euclidean distance} = \sqrt{(2 - 5)^2 + (2 - (-2))^2} = 5 $$

  • k-median relies on the Manhattan distance from the centroid to an example. This distance is the sum of the absolute deltas in each dimension. For example, the k-median distance between (2,2) and (5,-2) would be:

$$ \text{Manhattan distance} = |2 - 5| + |2 - (-2)| = 7 $$
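The two worked examples above can be checked with a few lines of Python. This is a sketch for illustration; the helper names `euclidean` and `manhattan` are made up here.

```python
import math

def euclidean(p, q):
    # Straight-line distance: square root of the sum of
    # squared per-dimension differences (Pythagorean theorem).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def manhattan(p, q):
    # Sum of the absolute per-dimension differences.
    return sum(abs(a - b) for a, b in zip(p, q))

print(euclidean((2, 2), (5, -2)))  # → 5.0
print(manhattan((2, 2), (5, -2)))  # → 7
```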

